# Russian Text Generation
## rugpt3medium_based_on_gpt2
A pretrained Russian language model based on the GPT-2 architecture, developed by the SberDevices team. It supports a sequence length of 1024 tokens and was trained on 80 billion tokens.
Tags: Large Language Model · Transformers · Other

Maintainer: ai-forever · 9,710 downloads · 21 likes
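A minimal usage sketch for the model above, assuming it is published on the Hugging Face Hub under the ID `ai-forever/rugpt3medium_based_on_gpt2` and loadable through the standard `transformers` causal-LM API; the hub ID, prompt, and sampling parameters are illustrative assumptions, not verified details from the card.

```python
# Hypothetical sketch: text generation with rugpt3medium_based_on_gpt2 via
# the Hugging Face transformers library. The hub ID and sampling settings
# are assumptions based on the card above.

MODEL_ID = "ai-forever/rugpt3medium_based_on_gpt2"  # assumed hub ID

# Generation settings consistent with the card's stated 1024-token context.
GEN_KWARGS = {
    "max_length": 1024,   # model's maximum supported sequence length
    "do_sample": True,    # sample instead of greedy decoding
    "top_p": 0.95,        # nucleus sampling cutoff
    "temperature": 0.8,   # softens the output distribution
}


def generate(prompt: str) -> str:
    """Generate a Russian continuation for `prompt`.

    Imports are deferred so the module can be inspected without
    transformers installed; calling this downloads model weights.
    """
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    model = AutoModelForCausalLM.from_pretrained(MODEL_ID)
    inputs = tokenizer(prompt, return_tensors="pt")
    outputs = model.generate(**inputs, **GEN_KWARGS)
    return tokenizer.decode(outputs[0], skip_special_tokens=True)


if __name__ == "__main__":
    # Example prompt: "Москва — это" ("Moscow is")
    print(generate("Москва — это"))
```

The deferred import and the `__main__` guard keep the module importable and cheap to test; only an explicit call to `generate` triggers the model download.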
## SBERBANK_RUS
A Russian text-generation model built on OpenAI's GPT-2 architecture, optimized and trained specifically for Russian text.
Tags: Large Language Model · Transformers · Other

Maintainer: Mary222 · 16 downloads · 2 likes